This article provides an overview of evolving Australian records continuum theory and the records continuum model, which is interpreted as both a metaphor and a new worldview, representing a paradigm shift in Kuhn's sense. It is based on a distillation of research findings drawn from discourse, literary warrant and historical analysis, as well as case studies, participant observation and reflection. The article traces the emergence in Australia in the 1990s of a community of practice which has taken continuum rather than life cycle based perspectives, and adopted postcustodial approaches to recordkeeping and archiving. It "places" the evolution of records continuum theory and practice in Australia in the context of a larger international discourse that was reconceptualizing traditional theory, and "reinventing" records and archives practice.
Publisher
Kluwer Academic Publishers
Publication Location
The Netherlands
Critical Arguments
CA Looks at the development of the Australian community of practice that led to records continuum theory: an approach that, in contrast to the North American life cycle approach, sees recordkeeping and archival practices as part of the same continuum of activities. Since the 1990s, there has been a lively debate between proponents of these two different ways of thinking. The second part of the article is highly theoretical, situating records continuum theory in the larger intellectual trend toward postmodernism and postpositivism.
Phrases
<P1> The model was built on a unifying concept of records inclusive of archives, which are defined as records of continuing value. It also drew on ideas about the "fixed" and "mutable" nature of records, the notion that records are "always in a process of becoming." (p. 334). <P2> Continuum ideas about the nature of records and archives challenge traditional understandings which differentiate "archives" from "records" on the basis of selection for permanent preservation in archival custody, and which focus on their fixed nature. Adopting a pluralist view of recorded information, continuum thinking characterises records as a special genre of documents in terms of their intent and functionality. It emphasises their evidentiary, transactional and contextual nature, rejecting approaches to the definition of records which focus on their subject content and informational value. (p. 335) <P3> [R]ecordkeeping and archiving processes ... help to assure the accessibility of meaningful records for as long as they are of value to people, organisations, and societies – whether that be for a nanosecond or millennia. (p. 336) <P4> [I]f North American understandings of the term record keeping, based on life cycle concepts of records management, are used to interpret the writings of members of the Australian recordkeeping community, there is considerable potential for misunderstanding. <P5> Members of the recordkeeping and archiving community have worked together, often in partnership with members of other records and archives communities, on a range of national policy and standards initiatives, particularly in response to the challenge of electronic recordkeeping. These collaborative efforts resulted in AS 4390, the Australian Standard: Records Management (1996), the Australian Council of Archives' Common Framework for Electronic Recordkeeping (1996), and the Australian Records and Archives Competency Standards (1997).
In a parallel and interconnected development, individual archival organisations have been developing electronic recordkeeping policies, standards, system design methodologies, and implementation strategies for their jurisdictions, including the National Archives of Australia's suite of standards, policies, and guidelines under the e-permanence initiative launched in early 2000. These developments have been deliberately set within the broader context of national standards and policy development frameworks. Two of the lead institutions in these initiatives are the National Archives of Australia and the State Records Authority of New South Wales, which have based their work in this area on exploration of fundamental questions about the nature of records and archives, and the role of recordkeeping and archiving in society. <warrant> (p. 339) <P6> In adopting a continuum-based worldview and defining its "place" in the world, the Australian recordkeeping and archiving community consciously rejected the life cycle worldview that had dominated records management and archives practice in the latter half of the 20th century in North America. ... They were also strong advocates of the nexus between accountable recordkeeping and accountability in a democratic society, and supporters of the dual role of an archival authority as both a regulator of current recordkeeping, and preserver of the collective memory of the state/nation. (p. 343-344) <P7> [P]ost-modern ideas about records view them as dynamic objects that are fixed in terms of content and meaningful elements of their structure, but linked to ever-broadening layers of contextual metadata that manages their meanings, and enables their accessibility and useability as they move through "spacetime." (p. 349)
<P8> In exploring the role of recordkeeping and archiving professionals within a postcustodial frame of reference, archival theorists such as Brothman, Brown, Cook, Harris, Hedstrom, Hurley, Nesmith, and Upward have concluded that they are an integral part of the record and archive making and keeping process, involved in society's remembering and forgetting. (p. 355) <P9> Writings on the societal context of functional appraisal have gone some way to translate into appraisal policies and strategies the implications of the shifts in perception away from seeing records managers as passive keepers of documentary detritus ... and archivists as Jenkinson's neutral, impartial custodians of inherited records. (p. 355-356)
Conclusions
RQ "By attempting to define, to categorise, pin down, and represent records and their contexts of creation, management, and use, descriptive standards and metadata schema can only ever represent a partial view of the dynamic, complex, and multi-dimensional nature of records, and their rich webs of contextual and documentary relationships. Within these limitations, what recordkeeping metadata research is reaching towards are ways to represent records and their contexts as richly and extensively as possible, to develop frameworks that recognise their mutable and contingent nature, as well as the role of recordkeeping and archiving professionals (records managers and archivists) in their creation and evolution, and to attempt to address issues relating to time and space." (p. 354)
Type
Electronic Journal
Title
Review: Some Comments on Preservation Metadata and the OAIS Model
CA Criticizes some of the limitations of OAIS and makes suggestions for improvements and clarifications. Also suggests that OAIS may be too library-centric, to the detriment of archival and especially recordkeeping needs. "In this article I have tried to articulate some of the main requirements for the records and archival community in preserving (archival) records. Based on this, the conclusion has to be that some adaptations to the [OAIS] model and metadata set would be necessary to meet these requirements. This concerns requirements such as the concept of authenticity of records, information on the business context of records and on relationships between records ('documentary context')." (p. 20)
Phrases
<P1> It requires records managers and archivists (and perhaps other information professionals) to be aware of these differences [in terminology] and to make a translation of such terms to their own domain. (p. 15) <P2> When applying the metadata model for a wider audience, more awareness of the issue of terminology is required, for instance by including clear definitions of key terms. (p. 15) <P3> The extent to which the management of objects can be influenced differs with respect to the type of objects. In the case of (government) records, legislation governs their creation and management, whereas, in the case of publications, the influence will be mostly based on agreements between producers, publishers and preservers. (p. 16) <P4> [A]lthough the suggestion may sometimes be otherwise, preservation metadata do not only apply to what is under the custody of a cultural or other preserving institution, but should be applied to the whole lifecycle of digital objects. ... Preservation can be viewed as part of maintenance. <warrant> (p. 16) <P5> [B]y taking library community needs as leading (albeit implicitly), the approach is already restricting the types of digital objects. Managing different types of 'digital objects', e.g. publications and records, may require not entirely similar sets of metadata. (p. 16) <P6> Another issue is that of the requirements governing the preservation processes. ... There needs to be insight and, as a consequence, also metadata about the preservation strategies, policies and methods, together with the context in which the preservation takes place. <warrant> (p. 16) <P7> [W]hat do we want to preserve? Is it the intellectual content with the functionality it has to have in order to make sense and achieve its purpose, or is it the digital components that are necessary to reproduce it or both? (p. 16-17)
<P8> My view is that 'digital objects' should be seen as objects having both conceptual and technical aspects that are closely interrelated. As a consequence of the explanation given above, a digital object may consist of more than one 'digital component'. The definition given in the OAIS model is therefore insufficient. (p. 17) <P9> [W]e have no fewer than five metadata elements that could contain information on what should be rendered and presented on the screen. How all these elements relate to each other, if at all, is unclear. (p. 17) <P10> What we want to achieve ... is that in the future we will still be able to see, read and understand the documents or other information entities that were once produced for a certain purpose and in a certain context. In trying to achieve this, we of course need to preserve these digital components, but, as information technology will evolve, these components have to be migrated or in some cases emulated to be usable on future hard- and software platforms. (p. 17) <P11> I would like to suggest including an element that reflects the original technical environment. (p. 18) <P12> Records, according to the recently published ISO records management standard 15489, are 'information created, received and maintained as evidence and information by an organisation or person, in pursuance of legal obligations or in the transaction of business'. ... The main requirements for records to serve as evidence or authoritative information sources are ... authenticity and integrity, and knowledge about the business context and about the interrelationship between records (e.g. in a case file). <warrant> (p. 18) <P13> It would have been helpful if there had been more acknowledgement of the issue of authenticity and the requirements for it, and if the Working Group had provided some background information about its view and considerations on this aspect and to what extent it is included or not. (p. 19)
<P14> In order to be able to preserve (archival) records it will ... be necessary to extend the information model with another class of information that refers to business context. Such a subset could provide a structure for describing what in archival terminology is called information about 'provenance' (with a different meaning from that in OAIS). (p. 19) <P15> In order to accommodate the identified complexity it is necessary to distinguish at least between the following categories of relationships: relationships between intellectual objects ... in the archival context this is referred to as 'documentary context'; relationships between the (structural) components of one intellectual object ... ; [and] relationships between digital components. (p. 19-20) <P16> [T]he issue of appraisal and disposition of records has to be included. In this context the recently published records management standard (ISO 15489) may serve as a useful framework. It would make the OAIS model even more widely applicable. (p. 20)
Conclusions
RQ "There are some issues ... which need further attention. They concern on the one hand the scope and underlying concepts of the OAIS model and the resulting metadata set as presented, and on the other hand the application of the model and metadata set in a records and archival environment. ... [T]he distinction between physical and conceptual or intellectual aspects of a digital object should be made more explicit and will probably have an impact on the model and metadata set also. More attention also needs to be given to the relationship between the (preservation) processes and the metadata. ... In assessing the needs of the records and archival community, the ISO records management standard 15489 may serve as a very useful framework. Such an exercise would also include a test for applicability of the model and metadata set for record-creating organisations and, as such, broaden the view of the OAIS model." (p. 20)
SOW
DC OAIS emerged out of an initiative spearheaded by NASA's Consultative Committee for Space Data Systems. It has been shaped and promoted by the RLG and OCLC. Several international projects have played key roles in shaping the OAIS model and adapting it for use in libraries, archives and research repositories. OAIS-modeled repositories include the CEDARS Project, Harvard's Digital Repository, Koninklijke Bibliotheek (KB), the Library of Congress' Archival Information Package for audiovisual materials, MIT's D-Space, OCLC's Digital Archive and TERM: the Texas Email Repository Model.
CA Describes efforts undertaken at the National Library of New Zealand to ensure preservation of electronic resources.
Phrases
<P1> The National Library Act 1965 provides the legislative framework for the National Library of New Zealand '... to collect, preserve, and make available recorded knowledge, particularly that relating to New Zealand, to supplement and further the work of other libraries in New Zealand, and to enrich the cultural and economic life of New Zealand and its cultural interchanges with other nations.' Legislation currently before Parliament, if enacted, will give the National Library the mandate to collect digital resources for preservation purposes. <warrant> (p. 18) <P2> So, the Library has an organisational commitment and may soon have the legislative environment to support the collection, management and preservation of digital objects. ... The next issue is what needs to be done to ensure that a viable preservation programme can actually be put in place. (p. 18) <P3> As the Library had already begun systematising its approach to resource discovery metadata, development of a preservation metadata schema for use within the Library was a logical next step. (p. 18) <P4> Work on the schema was initially informed by other international endeavours relating to preservation metadata, particularly that undertaken by the National Library of Australia. Initiatives through the CEDARS programme, OCLC/RLG activities and the emerging consensus regarding the role of the OAIS Reference Model ... were also taken into account. <warrant> (p. 18-19) <P5> The Library's Preservation Metadata schema is designed to strike a balance between the principles of preservation metadata, as expressed through the OAIS Information Model, and the practicalities of implementing a working set of preservation metadata. The same incentive informs a recent OCLC/RLG report on the OAIS model. (p. 19) <P6> [I]t is unlikely that anything resembling a comprehensive schema will become available in the short term. However, the need is pressing. (p. 19)
<P7> The development of the preservation metadata schema is one component of an ongoing programme of activities needed to ensure the incorporation of digital material into the Library's core business processes with a view to the long-term accessibility of those resources. <warrant> (p. 19) <P8> The aim of the above activities is for the Library to be acknowledged as a 'trusted repository' for digital material which ensures the viability and authenticity of digital objects over time. (p. 20) <P9> The Library will also have to develop relationships with other organisations that might wish to achieve 'trusted repository' status in a country with a small population base and few agencies of appropriate size, funding and willingness to take on the role.
Conclusions
RQ There are still a number of important issues to be resolved before the Library's preservation programme can be deemed a success, including the need for: higher level of awareness of the need for digital preservation within the community of 'memory institutions' and more widely; metrics regarding the size and scope of the problem; finance to research and implement digital preservation; new skill sets for implementing digital preservation, e.g. running the multiplicity of hardware/software involved, digital conservation/archaeology; agreed international approaches to digital preservation; practical models to match the high level conceptual work already undertaken internationally; co-operation/collaboration between the wider range of agents potentially able to assist in developing digital preservation solutions, e.g. the computing industry; and, last but not least, clarity around intellectual property, copyright, privacy and moral rights.
SOW
DC OAIS emerged out of an initiative spearheaded by NASA's Consultative Committee for Space Data Systems. It has been shaped and promoted by the RLG and OCLC. Several international projects have played key roles in shaping the OAIS model and adapting it for use in libraries, archives and research repositories. OAIS-modeled repositories include the CEDARS Project, Harvard's Digital Repository, Koninklijke Bibliotheek (KB), the Library of Congress' Archival Information Package for audiovisual materials, MIT's D-Space, OCLC's Digital Archive and TERM: the Texas Email Repository Model.
Type
Electronic Journal
Title
The Warwick Framework: A container architecture for diverse sets of metadata
This paper is an abbreviated version of The Warwick Framework: A Container Architecture for Aggregating Sets of Metadata. It describes a container architecture for aggregating logically, and perhaps physically, distinct packages of metadata. This "Warwick Framework" is the result of the April 1996 Metadata II Workshop in Warwick, U.K.
ISBN
1082-9873
Critical Arguments
CA Describes the Warwick Framework, a proposal for linking together the various metadata schemes that may be attached to a given information object by using a system of "packages" and "containers." "[Warwick Workshop] attendees concluded that ... the route to progress on the metadata issue lay in the formulation of a higher-level context for the Dublin Core. This context should define how the Core can be combined with other sets of metadata in a manner that addresses the individual integrity, distinct audiences, and separate realms of responsibility of these distinct metadata sets. The result of the Warwick Workshop is a container architecture, known as the Warwick Framework. The framework is a mechanism for aggregating logically, and perhaps physically, distinct packages of metadata. This is a modularization of the metadata issue with a number of notable characteristics. It allows the designers of individual metadata sets to focus on their specific requirements, without concerns for generalization to ultimately unbounded scope. It allows the syntax of metadata sets to vary in conformance with semantic requirements, community practices, and functional (processing) requirements for the kind of metadata in question. It separates management of and responsibility for specific metadata sets among their respective "communities of expertise." It promotes interoperability by allowing tools and agents to selectively access and manipulate individual packages and ignore others. It permits access to the different metadata sets that are related to the same object to be separately controlled. It flexibly accommodates future metadata sets by not requiring changes to existing sets or the programs that make use of them."
Phrases
<P1> The range of metadata needed to describe and manage objects is likely to continue to expand as we become more sophisticated in the ways in which we characterize and retrieve objects and also more demanding in our requirements to control the use of networked information objects. The architecture must be sufficiently flexible to incorporate new semantics without requiring a rewrite of existing metadata sets. <warrant> <P2> Each logically distinct metadata set may represent the interests of and domain of expertise of a specific community. <P3> Just as there are disparate sources of metadata, different metadata sets are used by and may be restricted to distinct communities of users and agents. <P4> Strictly partitioning the information universe into data and metadata is misleading. <P5> If we allow for the fact that metadata for an object consists of logically distinct and separately administered components, then we should also provide for the distribution of these components among several servers or repositories. The references to distributed components should be via a reliable persistent name scheme, such as that proposed for Universal Resources Names (URNs) and Handles. <P6> [W]e emphasize that the existence of a reliable URN implementation is necessary to avoid the problems of dangling references that plague the Web. <warrant> <P7> Anyone can, in fact, create descriptive data for a networked resource, without permission or knowledge of the owner or manager of that resource. This metadata is fundamentally different from that metadata that the owner of a resource chooses to link or embed with the resource. We, therefore, informally distinguish between two categories of metadata containers, which both have the same implementation [internally referenced and externally referenced metadata containers].
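The container/package architecture quoted above can be illustrated with a minimal sketch. This is not code from the paper or any implementation; the names (Container, Package, the scheme labels) are illustrative assumptions. It shows the core idea: a container aggregates logically distinct metadata packages, each typed by its scheme, possibly held externally by persistent reference, and agents select only the packages they understand.

```python
# Illustrative sketch of the Warwick Framework container idea.
# All class and field names are hypothetical, not from any specification.

from dataclasses import dataclass, field
from typing import Optional


@dataclass
class Package:
    """One logically distinct metadata set, managed by its own community."""
    scheme: str                        # e.g. "dublin-core", "rights", "provenance"
    data: dict                         # metadata content, in the scheme's own syntax
    external_ref: Optional[str] = None  # persistent name (e.g. a URN) if stored remotely


@dataclass
class Container:
    """Aggregates packages; access to each package can be controlled separately."""
    packages: list = field(default_factory=list)

    def add(self, package: Package) -> None:
        self.packages.append(package)

    def select(self, scheme: str) -> list:
        # An agent reads the package types it understands and ignores the rest.
        return [p for p in self.packages if p.scheme == scheme]


container = Container()
container.add(Package("dublin-core", {"title": "Some networked resource"}))
# An externally referenced package: only a persistent name is held locally.
container.add(Package("rights", {}, external_ref="urn:example:rights:1"))

print([p.scheme for p in container.select("dublin-core")])
```

The sketch mirrors the interoperability point in the annotation: a browser that knows only Dublin Core calls `select("dublin-core")` and is unaffected when new package types are added to the same container.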
Conclusions
RQ "We run the danger, with the full expressiveness of the Warwick Framework, of creating such complexity that the metadata is effectively useless. Finding the appropriate balance is a central design problem. ... Definers of specific metadata sets should ensure that the set of operations and semantics of those operations will be strictly defined for a package of a given type. We expect that a limited set of metadata types will be widely used and 'understood' by browsers and agents. However, the type system must be extensible, and some method that allows existing clients and agents to process new types must be a part of a full implementation of the Framework. ... There is a need to agree on one or more syntaxes for the various metadata sets. Even in the context of the relatively simple World Wide Web, the Internet is often unbearably slow and unreliable. Connections often fail or time out due to high load, server failure, and the like. In a full implementation of the Warwick Framework, access to a "document" might require negotiation across distributed repositories. The performance of this distributed architecture is difficult to predict and is prone to multiple points of failure. ... It is clear that some protocol work will need to be done to support container and package interchange and retrieval. ... Some examination of the relationship between the Warwick Framework and ongoing work in repository architectures would likely be fruitful."
CA This is the first of four articles describing geospatial standards and the standards bodies that develop them. This article discusses what geospatial standards are and why they matter, identifies major standards organizations, and lists the characteristics of successful geospatial standards.
Conclusions
RQ Which federal and international standards have been agreed upon since this article's publication?
SOW
DC FGDC approved the Content Standard for Digital Geospatial Metadata (FGDC-STD-001-1998) in June 1998. FGDC is a 19-member interagency committee composed of representatives from the Executive Office of the President, Cabinet-level and independent agencies. The FGDC is developing the National Spatial Data Infrastructure (NSDI) in cooperation with organizations from State, local and tribal governments, the academic community, and the private sector. The NSDI encompasses policies, standards, and procedures for organizations to cooperatively produce and share geographic data.
Type
Web Page
Title
JISC/NPO studies on the preservation of electronic materials: A framework of data types and formats, and issues affecting the long term preservation of digital material
CA Proposes a framework for preserving digital objects and discusses steps in the preservation process. Addresses a series of four questions: Why preserve? How much? How? And Where? Proposes a "Preservation Complexity Scorecard" to help identify the complexity of preservation needs and the appropriate preservation approach for a given object. "Although a great deal has been discussed and written about digital material preservation, there would appear to be no overall structure which brings together the findings of the numerous contributors to the debate, and allows them to be compared. This Report attempts to provide such a structure, whereby it should be possible to identify the essential elements of the preservation debate and to determine objectively the criticality of the other unresolved issues. This Report attempts to identify the most critical issues and employ them in order to determine their affect [sic] on preservation practice." (p. 5)
Conclusions
RQ "The study concludes that the overall management task in long term preservation is to moderate the pressure to preserve (Step 1) with the constraints dictated by a cost-effective archive (Step 3). This continuing process of moderation is documented through the Scorecard." (p. 6) "The Study overall recommends that a work programme should be started to: (a) Establish a Scorecard approach (to measure preservation complexity), (b) Establish an inventory of archive items (with complexity ratings) and (c) Establish a Technology Watch (to monitor shifts in technology), in order to be able to manage technological change. And in support of this, (a) establish a programme of work to explore the interaction of stakeholders and a four level contextual mode in the preservation process." (p. 6) A four level contextual approach, with data dictionary entry definitions, should be built in order to provide an information structure that will permit the successful retrieval and interpretation of an object in 50 years time. A study should be established to explore the principle of encapsulating documents using the four levels of context, stored in a format, possibly encrypted, that can be transferred across technologies and over time. <warrant> (p. 31) A more detailed study should be made of the inter-relationships of the ten stakeholders, and how they can be made to support the long term preservation of digital material. This will be linked to the economics of archive management (the cost model), changes in legislation (Legal Deposit, etc.), the risks of relying on links between National Libraries to maintain collections (threats of wholesale destruction of collections), and loss through viruses (technological turbulence). (p. 36) A technology management trail (within the Scorecard -- see Step 2 of the Framework) should be established before the more complex digital material is stored. This is to ensure that, for an item of digital material, the full extent of the internal interrelationships are understood, and the implications for long term preservation in a variety of successive environments are documented. (p. 37)
SOW
DC "The study is part of a wider programme of studies, funded by the Joint Information Systems Committee ("JISC"). The programme was initiated as a consequence of a two day workshop at Warwick University, in late November 1995. The workshop addressed the Long Term Preservation of Electronic Materials. The attendees represented an important cross-section of academic, librarian, curatorial, managerial and technological interests. 18 potential action points emerged, and these were seen as a basis for initiating further activity. After consultation, JISC agreed to fund a programme of studies." (p. 7) "The programme of studies is guided by the Digital Archive Working Group, which reports to the Management Committee of the National Preservation Office. The programme is administered by the British Library Research and Innovation Centre." (p. 2)
Type
Web Page
Title
CDL Digital Object Standard: Metadata, Content and Encoding
This document addresses the standards for digital object collections for the California Digital Library. Adherence to these standards is required for all CDL contributors and may also serve University of California staff as guidelines for digital object creation and presentation. These standards are not intended to address all of the administrative, operational, and technical issues surrounding the creation of digital object collections.
Critical Arguments
CA These standards describe the file formats, storage and access standards for digital objects created by or incorporated into the CDL as part of the permanent collections. They attempt to balance adherence to industry standards, reproduction quality, access, potential longevity and cost.
Conclusions
RQ not applicable
SOW
DC "This is the first version of the CDL Digital Object Standard. This version is based upon the September 1, 1999 version of the CDL's Digital Image Standard, which included recommendations of the Museum Educational Site Licensing Project (MESL), the Library of Congress and the MOA II participants." ... "The Museum Educational Site Licensing Project (MESL) offered a framework for seven collecting institutions, primarily museums, and seven universities to experiment with new ways to distribute visual information--both images and related textual materials." ... "The Making of America (MoA II) Testbed Project is a Digital Library Federation (DLF) coordinated, multi-phase endeavor to investigate important issues in the creation of an integrated, but distributed, digital library of archival materials (i.e., digitized surrogates of primary source materials found in archives and special collections). The participants include Cornell University, New York Public Library, Pennsylvania State University, Stanford University and UC Berkeley. The Library of Congress white papers and standards are based on the experience gained during the American Memory Pilot Project. The concepts discussed and the principles developed still guide the Library's digital conversion efforts, although they are under revision to accommodate the capabilities of new technologies and new digital formats." ...
"The CDL Technical Architecture and Standards Workgroup includes the following members with extensive experience with digital object collection and management: Howard Besser, MESL and MOA II digital imaging testbed projects; Diane Bisom, University of California, Irvine; Bernie Hurley, MOA II, University of California, Berkeley; Greg Janee, Alexandria Digital Library; John Kunze, University of California, San Francisco; Reagan Moore and Chaitanya Baru, San Diego Supercomputer Center, ongoing research with the National Archives and Records Administration on the long term storage and retrieval of digital content; Terry Ryan, University of California, Los Angeles; David Walker, California Digital Library"
The CDISC Submission Metadata Model was created to help ensure that the supporting metadata for these submission datasets should meet the following objectives: Provide FDA reviewers with clear descriptions of the usage, structure, contents, and attributes of all datasets and variables; Allow reviewers to replicate most analyses, tables, graphs, and listings with minimal or no transformations; Enable reviewers to easily view and subset the data used to generate any analysis, table, graph, or listing without complex programming. ... The CDISC Submission Metadata Model has been defined to guide sponsors in the preparation of data that is to be submitted to the FDA. By following the principles of this model, sponsors will help reviewers to accurately interpret the contents of submitted data and work with it more effectively, without sacrificing the scientific objectives of clinical development.
Publisher
The Clinical Data Interchange Standards Consortium
Critical Arguements
CA "The CDISC Submission Data Model has focused on the use of effective metadata as the most practical way of establishing meaningful standards applicable to electronic data submitted for FDA review."
Conclusions
RQ "Metadata prepared for a domain (such as an efficacy domain) which has not been described in a CDISC model should follow the general format of the safety domains, including the same set of core selection variables and all of the metadata attributes specified for the safety domains. Additional examples and usage guidelines are available on the CDISC web site at www.cdisc.org." ... "The CDISC Metadata Model describes the structure and form of data, not the content. However, the varying nature of clinical data in general will require the sponsor to make some decisions about how to represent certain real-world conditions in the dataset. Therefore, it is useful for a metadata document to give the reviewer an indication of how the datasets handle certain special cases."
SOW
DC CDISC is an open, multidisciplinary, non-profit organization committed to the development of worldwide standards to support the electronic acquisition, exchange, submission and archiving of clinical trials data and metadata for medical and biopharmaceutical product development. CDISC members work together to establish universally accepted data standards in the pharmaceutical, biotechnology and device industries, as well as in regulatory agencies worldwide. CDISC currently has more than 90 members, including the majority of the major global pharmaceutical companies.
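The objectives quoted in this record (clear descriptions of the usage, structure, contents, and attributes of all datasets and variables) rest on machine-readable dataset- and variable-level metadata. A minimal sketch in Python of what such two-level metadata might look like; the domain, variable names, and attribute set here are illustrative, loosely modeled on safety-domain conventions, and are not the official CDISC element names:

```python
# Illustrative sketch of dataset- and variable-level submission metadata.
# Names and attributes are hypothetical examples, not the CDISC model itself.

dataset_metadata = {
    "name": "AE",  # hypothetical safety domain: adverse events
    "description": "Adverse events reported during the trial",
    "structure": "one record per adverse event per subject",
    "variables": [
        {"name": "USUBJID", "label": "Unique Subject Identifier", "type": "char"},
        {"name": "AETERM", "label": "Reported Term for the Adverse Event", "type": "char"},
        {"name": "AESTDT", "label": "Start Date of Adverse Event", "type": "date"},
    ],
}

def describe(meta):
    """Render the variable-level metadata a reviewer would consult."""
    lines = [f"Dataset {meta['name']}: {meta['description']}"]
    for var in meta["variables"]:
        lines.append(f"  {var['name']:10} {var['type']:5} {var['label']}")
    return "\n".join(lines)

print(describe(dataset_metadata))
```

With metadata structured this way, a reviewer can enumerate every variable in a submitted dataset without opening the data itself, which is the practical point the model argues for.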
Type
Web Page
Title
CDISC Achieves Two Significant Milestones in the Development of Models for Data Interchange
CA "The Clinical Data Interchange Standards Consortium has achieved two significant milestones towards its goal of standard data models to streamline drug development and regulatory review processes. CDISC participants have completed metadata models for the 12 safety domains listed in the FDA Guidance regarding Electronic Submissions and have produced a revised XML-based data model to support data acquisition and archive."
Conclusions
RQ "The goal of the CDISC XML Document Type Definition (DTD) Version 1.0 is to make available a first release of the definition of this CDISC model, in order to support sponsors, vendors and CROs in the design of systems and processes around a standard interchange format."
SOW
DC "This team, under the leadership of Wayne Kubick of Lincoln Technologies, and Dave Christiansen of Genentech, presented their metadata models to a group of representatives at the FDA on Oct. 10, and discussed future cooperative efforts with Agency reviewers."... "CDISC is a non-profit organization with a mission to lead the development of standard, vendor-neutral, platform-independent data models that improve process efficiency while supporting the scientific nature of clinical research in the biopharmaceutical and healthcare industries"
Type
Web Page
Title
PBCore: Public Broadcasting Metadata Dictionary Project
CA "PBCore is designed to provide -- for television, radio and Web activities -- a standard way of describing and using media (video, audio, text, images, rich interactive learning objects). It allows content to be more easily retrieved and shared among colleagues, software systems, institutions, community and production partners, private citizens, and educators. It can also be used as a guide for the onset of an archival or asset management process at an individual station or institution. ... The Public Broadcasting Metadata Dictionary (PBCore) is: a core set of terms and descriptors (elements) used to create information (metadata) that categorizes or describes media items (sometimes called assets or resources)."
Conclusions
RQ The PBCore Metadata Elements are currently in their first published edition, Version 1.0. Over two years of research and lively discussions have generated this version. ... As various users and communities begin to implement the PBCore, updates and refinements to the PBCore are likely to occur. Any changes will be clearly identified, ramifications outlined, and published to our constituents.
SOW
DC "Initial development funding for PBCore was provided by the Corporation for Public Broadcasting. The PBCore is built on the foundation of the Dublin Core (ISO 15836) ... and has been reviewed by the Dublin Core Metadata Initiative Usage Board. ... PBCore was successfully deployed in a number of test implementations in May 2004 in coordination with WGBH, Minnesota Public Radio, PBS, National Public Radio, Kentucky Educational Television, and recognized metadata expert Grace Agnew. As of July 2004 in response to consistent feedback to make metadata standards easy to use, the number of metadata elements was reduced to 48 from the original set of 58 developed by the Metadata Dictionary Team. Also, efforts are ongoing to provide more focused metadata examples that are specific to TV and radio. ... Available free of charge to public broadcasting stations, distributors, vendors, and partners, version 1.0 of PBCore was launched in the first quarter of 2005. See our Licensing Agreement via the Creative Commons for further information. ... Plans are under way to designate an Authority/Maintenance Organization."
Type
Web Page
Title
Metadata for preservation : CEDARS project document AIW01
This report is a review of metadata formats and initiatives in the specific area of digital preservation. It supplements the DESIRE Review of metadata (Dempsey et al. 1997). It is based on a literature review and information picked up at a number of workshops and meetings, and is an attempt to briefly describe the state of the art in the area of metadata for digital preservation.
Critical Arguements
CA "The projects, initiatives and formats reviewed in this report show that much work remains to be done. . . . The adoption of persistent and unique identifiers is vital, both in the CEDARS project and outside. Many of these initiatives mention "wrappers", "containers" and "frameworks". Some thought should be given to how metadata should be integrated with data content in CEDARS. Authenticity (or intellectual preservation) is going to be important. It will be interesting to investigate whether some archivists' concerns with custody or "distributed custody" will have relevance to CEDARS."
Conclusions
RQ Which standards and initiatives described in this document have proved viable preservation metadata models?
SOW
DC OAIS emerged out of an initiative spearheaded by NASA's Consultative Committee for Space Data Systems. It has been shaped and promoted by the RLG and OCLC. Several international projects have played key roles in shaping the OAIS model and adapting it for use in libraries, archives and research repositories. OAIS-modeled repositories include the CEDARS Project, Harvard's Digital Repository, Koninklijke Bibliotheek (KB), the Library of Congress' Archival Information Package for audiovisual materials, MIT's D-Space, OCLC's Digital Archive and TERM: the Texas Email Repository Model.
Type
Web Page
Title
Softening the borderlines of archives through XML - a case study
Archives have always had trouble getting metadata in formats they can process. With XML, these problems are lessening. Many applications today provide the option of exporting data into an application-defined XML format that can easily be post-processed using XSLT, schema mappers, etc., to fit the archives' needs. This paper highlights two practical examples of the use of XML in the Swiss Federal Archives and discusses advantages and disadvantages of XML in these examples. The first use of XML is the import of existing metadata describing debates at the Swiss parliament, whereas the second concerns preservation of metadata in the archiving of relational databases. We have found that the use of XML for metadata encoding is beneficial for the archives, especially for its ease of editing, built-in validation and ease of transformation.
Notes
The Swiss Federal Archives defines the norms and basis of records management and advises departments of the Federal Administration on their implementation. http://www.bar.admin.ch/bar/engine/ShowPage?pageName=ueberlieferung_aktenfuehrung.jsp
Critical Arguements
CA "This paper briefly discusses possible uses of XML in an archival context and the policies of the Swiss Federal Archives concerning this use (Section 2), provides a rough overview of the applications we have that use XML (Section 3) and the experiences we made (Section 4)."
Conclusions
RQ "The systems described above are now just being deployed into real world use, so the experiences presented here are drawn from the development process and preliminary testing. No hard facts in testing the sustainability of XML could be gathered, as the test is time itself. This test will be passed when we can still access the data stored today, including all metadata, in ten or twenty years." ... "The main problem area with our applications was the encoding of the XML documents and the non-standard XML document generation of some applications. When dealing with the different encodings (UTF-8, UTF-16, ISO-8859-1, etc) some applications purported a different encoding in the header of the XML document than the true encoding of the document. These errors were quickly identified, as no application was able to read the documents."
SOW
DC The author is currently a private digital archives consultant, but at the time of this article, was a data architect for the Swiss Federal Archives. The content of this article owes much to the work being done by a team of architects and engineers at the Archives, who are working on an e-government project called ARELDA (Archiving of Electronic Data and Records).
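The main problem the RQ field reports (documents whose XML declaration claims one encoding while the bytes actually use another) can be caught with a simple validation step before ingest. A hedged sketch in Python, not the Archives' actual tooling:

```python
import re

def declared_encoding(raw: bytes) -> str:
    """Read the encoding claimed in the XML declaration; XML defaults
    to UTF-8 when no declaration or encoding attribute is present."""
    match = re.match(rb'<\?xml[^>]*encoding="([^"]+)"', raw)
    return match.group(1).decode("ascii").lower() if match else "utf-8"

def check_encoding(raw: bytes) -> bool:
    """Return True if the document's bytes decode cleanly under its
    declared encoding -- the check that exposed the mismatches above."""
    enc = declared_encoding(raw)
    try:
        raw.decode(enc)
        return True
    except (UnicodeDecodeError, LookupError):
        return False

# 'good' matches its declaration; 'bad' declares utf-8 but holds latin-1 bytes.
good = '<?xml version="1.0" encoding="utf-8"?><a>é</a>'.encode("utf-8")
bad = '<?xml version="1.0" encoding="utf-8"?><a>é</a>'.encode("latin-1")
print(check_encoding(good), check_encoding(bad))
```

As the paper notes, such errors surface quickly in practice because no conformant parser can read the mismatched documents; an explicit pre-ingest check simply moves the failure to where it can be fixed.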
"The ERMS Metadata Standard forms Part 2 of the National Archives' 'Requirements for Electronic Records Management Systems' (commonly known as the '2002 Requirements'). It is specified in a technology independent manner, and is aligned with the e-Government Metadata Standard (e-GMS) version 2, April 2003. A version of e-GMS v2 including XML examples was published in the autumn of 2003. This Guide should be read in conjunction with the ERMS Metadata Standard. Readers may find the GovTalk Schema Guidelines (available via http://www.govtalk.gov.uk ) helpful regarding design rules used in building the schemas."
Conclusions
RQ Electronically enabled processes need to generate appropriate records, according to established records management principles. These records need to reach the ERMS that captures them with enough information to enable the ERMS to classify them appropriately, allocate an appropriate retention policy, etc.
SOW
DC This document is a draft.
Type
Web Page
Title
Recordkeeping Metadata Standard for Commonwealth Agencies
This standard describes the metadata that the National Archives of Australia recommends should be captured in the recordkeeping systems used by Commonwealth government agencies. ... Part One of the standard explains the purpose and importance of standardised recordkeeping metadata and details the scope, intended application and features of the standard. Features include: flexibility of application; repeatability of data elements; extensibility to allow for the management of agency-specific recordkeeping requirements; interoperability across systems environments; compatibility with related metadata standards, including the Australian Government Locator Service (AGLS) standard; and interdependency of metadata at the sub-element level.
Critical Arguements
CA Compliance with the Recordkeeping Metadata Standard for Commonwealth Agencies will help agencies to identify, authenticate, describe and manage their electronic records in a systematic and consistent way to meet business, accountability and archival requirements. In this respect the metadata is an electronic recordkeeping aid, similar to the descriptive information captured in file registers, file covers, movement cards, indexes and other registry tools used in the paper-based environment to apply intellectual and physical controls to records.
Conclusions
RQ "The National Archives intends to consult with agencies, vendors and other interested parties on the implementation and continuing evolution of the Recordkeeping Metadata Standard for Commonwealth Agencies." ... "The National Archives expects to re-examine and reissue the standard in response to broad agency feedback and relevant advances in theory and methodology." ... "The development of public key technology is one area the National Archives will monitor closely, in consultation with the Office for Government Online, for possible additions to a future version of the standard."
SOW
DC "This standard has been developed in consultation with recordkeeping software vendors endorsed by the Office for Government OnlineÔÇÖs Shared Systems Initiative, as well as selected Commonwealth agencies." ... "The standard has also been developed with reference to other metadata standards emerging in Australia and overseas to ensure compatibility, as far as practicable, between related resource management tools, including: the Dublin Core-derived Australian Government Locator Service (AGLS) metadata standard for discovery and retrieval of government services and information in web-based environments, co-ordinated by the National Archives of Australia; and the non-sector-specific Recordkeeping Metadata Standards for Managing and Accessing Information Resources in Networked Environments Over Time for Government, Social and Cultural Purposes, co-ordinated by Monash University using an Australian Research Council Strategic Partnership with Industry Research and Training (SPIRT) Support Grant."
This document is a revision and expansion of "Metadata Made Simpler: A guide for libraries," published by NISO Press in 2001.
Publisher
NISO Press
Critical Arguements
CA An overview of what metadata is and does, aimed at librarians and other information professionals. Describes various metadata schemas. Concludes with a bibliography and glossary.
Type
Web Page
Title
Use of Encoded Archival Description (EAD) for Manuscript Collection Finding Aids
Presented in 1999 to the Library's Collection Development & Management Committee, this report outlines support for implementing EAD in delivery of finding aids for library collections over the Web. It describes the limitations of HTML, provides an introduction to SGML, XML, and EAD, outlines the advantages of conversion from HTML to EAD, the conversion process, the proposed outcome, and sources for further information.
Publisher
National Library of Australia
Critical Arguements
CA As use of the World Wide Web has increased, so has the need of users to be able to discover web-based information resources easily and efficiently, and to be able to repeat that discovery in a consistent manner. Using SGML to mark up web-based documents facilitates such resource discovery.
Conclusions
RQ To what extent have the mainstream web browser companies fulfilled their committment to support native viewing of SGML/XML documents?
Type
Web Page
Title
Preservation Metadata and the OAIS Information Model: A Metadata Framework to Support the Preservation of Digital Objects
CA "In March 2000, OCLC and RLG sponsored the creation of a working group to explore consensus-building in the area of preservation metadata. ... The charge of the group was to pool their expertise and experience to develop a preservation metadata framework applicable to a broad range of digital preservation activities." (p.1) "The OAIS information model offers a broad categorization of the types of information falling under the scope of preservation metadata; it falls short, however, of providing a decomposition of these information types into a list of metadata elements suitable for practical implementation. It is this need that the working group addressed in the course of its activities, the results of which are reported in this paper." (p. 47)
Conclusions
RQ "The metadata framework described in this paper can serve as a foundation for future work in the area of preservation metadata. Issues of particular importance include strategies and best practices for implementing preservation metadata in an archival system; assessing the degree of descriptive richness required by various types of digital preservation activities; developing algorithms for producing preservation metadata automatically; determining the scope for sharing preservation metadata in a cooperative environment; and moving beyond best practice towards an effort at formal standards building in this area." (47)
SOW
DC "[The OCLC and RLG working group] began its work by publishing a white paper entitled Preservation Metadata for Digital Objects: A Review of the State of the Art, which defined and discussed the concept of preservation metadata, reviewed current thinking and practice in the use of preservation metadata, and identified starting points for consensus-building activity in this area. The group then turned its attention to the main focus of its activity -- the collaborative development of a preservation metadata framework. This paper reports the results of the working groupÔÇÖs efforts in that regard." (p. 1-2)
During the past decade, the recordkeeping practices in public and private organizations have been revolutionized. New information technologies from mainframes, to PC's, to local area networks and the Internet have transformed the way state agencies create, use, disseminate, and store information. These new technologies offer a vastly enhanced means of collecting information for and about citizens, communicating within state government and between state agencies and the public, and documenting the business of government. Like other modern organizations, Ohio state agencies face challenges in managing and preserving their records because records are increasingly generated and stored in computer-based information systems. The Ohio Historical Society serves as the official State Archives with responsibility to assist state and local agencies in the preservation of records with enduring value. The Office of the State Records Administrator within the Department of Administrative Services (DAS) provides advice to state agencies on the proper management and disposition of government records. Out of concern over its ability to preserve electronic records with enduring value and assist agencies with electronic records issues, the State Archives has adapted these guidelines from guidelines created by the Kansas State Historical Society. The Kansas State Historical Society, through the Kansas State Historical Records Advisory Board, requested a program development grant from the National Historical Publications and Records Commission to develop policies and guidelines for electronic records management in the state of Kansas. With grant funds, the KSHS hired a consultant, Dr. Margaret Hedstrom, an Associate Professor in the School of Information, University of Michigan and formerly Chief of State Records Advisory Services at the New York State Archives and Records Administration, to draft guidelines that could be tested, revised, and then implemented in Kansas state government.
Notes
These guidelines are part of the ongoing effort to address the electronic records management needs of Ohio state government. As a result, this document continues to undergo changes. The first draft, written by Dr. Margaret Hedstrom, was completed in November of 1997 for the Kansas State Historical Society. That version was reorganized and updated and posted to the KSHS Web site on August 18, 1999. The Kansas Guidelines were modified for use in Ohio during September 2000.
Critical Arguements
CA "This publication is about maintaining accountability and preserving important historical records in the electronic age. It is designed to provide guidance to users and managers of computer systems in Ohio government about: the problems associated with managing electronic records, special recordkeeping and accountability concerns that arise in the context of electronic government; archival strategies for the identification, management and preservation of electronic records with enduring value; identification and appropriate disposition of electronic records with short-term value, and
Type
Web Page
Title
Requirements for Electronic Records Management Systems: (2) Metadata Standard
Requirements for Electronic Records Management Systems includes: (1) "Functional Requirements" (http://www.nationalarchives.gov.uk/electronicrecords/reqs2002/pdf/requirementsfinal.pdf); (2) "Metadata Standard" (the subject of this record); (3) Reference Document (http://www.nationalarchives.gov.uk/electronicrecords/reqs2002/pdf/referencefinal.pdf); and (4) "Implementation Guidance: Configuration and Metadata Issues" (http://www.nationalarchives.gov.uk/electronicrecords/reqs2002/pdf/implementation.pdf)
Publisher
Public Records Office, [British] National Archives
Critical Arguements
CA Sets out the implications for records management metadata in compliant systems. It has been agreed with the Office of the e-Envoy that this document will form the basis for an XML schema to support the exchange of records metadata and promote interoperability between ERMS and other systems
SOW
DC The National Archives updated the functional requirements for electronic records management systems (ERMS) in collaboration with the central government records management community during 2002. The revision takes account of developments in cross-government and international standards since 1999.
Type
Web Page
Title
Descriptive Metadata Guidelines for RLG Cultural Materials
To ensure that the digital collections submitted to RLG Cultural Materials can be discovered and understood, RLG has compiled these Descriptive Metadata Guidelines for contributors. While these guidelines reflect the needs of one particular service, they also represent a case study in information sharing across community and national boundaries. RLG Cultural Materials engages a wide range of contributors with different local practices and institutional priorities. Since it is impossible to find -- and impractical to impose -- one universally applicable standard as a submission format, RLG encourages contributors to follow the suite of standards applicable to their particular community (p.1).
Critical Arguements
CA "These guidelines . . . do not set a new standard for metadata submission, but rather support a baseline that can be met by any number of strategies, enabling participating institutions to leverage their local descriptions. These guidelines also highlight the types of metadata that enhance functionality for RLG Cultural Materials. After a contributor submits a collection, RLG maps that description into the RLG Cultural Materials database using the RLG Cultural Materials data model. This ensures that metadata from the various participant communities is integrated for efficient searching and retrieval" (p.1).
Conclusions
RQ Not applicable.
SOW
DC RLG comprises more than 150 research and cultural memory institutions, and RLG Cultural Materials elicits contributions from countless museums, archives, and libraries from around the world that, although they might retain local descriptive standards and metadata schemas, must conform to the baseline standards prescribed in this document in order to integrate into RLG Cultural Materials. Appendix A presents and evaluates the most common metadata standards with which RLG Cultural Materials is able to work.
Type
Web Page
Title
Archiving of Electronic Digital Data and Records in the Swiss Federal Archives (ARELDA): e-government project ARELDA - Management Summary
The goal of the ARELDA project is to find long-term solutions for the archiving of digital records in the Swiss Federal Archives. This includes the accession, the long-term storage, preservation of data, description, and access for the users of the Swiss Federal Archives. It is also coordinated with the basic efforts of the Federal Archives to realize a uniform records management solution in the federal administration and therefore to support the pre-archival creation of documents of archival value for the benefits of the administration as well as of the Federal Archives. The project is indispensable for the long-term execution of the Federal Archives Act; Older IT systems are being replaced by newer ones. A complete migration of the data is sometimes not possible or too expensive; A constant increase of small database applications, built and maintained by people with no IT background; More and more administrative bodies are introducing records and document management systems.
Publisher
Swiss Federal Archives
Publication Location
Bern
Critical Arguements
CA "Archiving in general is a necessary prerequisite for the reconstruction of governmental activities as well as for the principle of legal certainty. It enables citizens to understand governmental activities and ensures a democratic control of the federal administration. And finally are archives a prerequisite for the scientific research, especially in the social and historical fields and ensure the preservation of our cultural heritage. It plays a vital role for an ongoing and efficient records management. A necessary prerequisite for the Federal Archives in the era of the information society will be the system ARELDA (Archiving of Electronic Data and Records)."
Conclusions
RQ "Because of the lack of standard solutions and limited or lacking personal resources for an internal development effort, the realisation of ARELDA will have to be outsourced and the cooperation with the IT division and the Federal Office for Information Technology, Systems and Telecommunication must be intensified. The guidelines for the projects are as follows:
SOW
DC ARELDA is one of the five key projects in the Swiss government's e-government strategy.
Museums and the Online Archive of California (MOAC) builds on existing standards and their implementation guidelines provided by the Online Archive of California (OAC) and its parent organization, the California Digital Library (CDL). Setting project standards for MOAC consisted of interpreting existing OAC/CDL documents and adapting them to the project's specific needs, while at the same time maintaining compliance with OAC/CDL guidelines. The present overview of the MOAC technical standards references both the OAC/CDL umbrella document and the MOAC implementation/adaptation document at the beginning of each section, as well as related resources which provide more detail on project specifications.
Critical Arguements
CA The project implements specifications for digital image production, as well as three interlocking file exchange formats for delivering collections, digital images and their respective metadata. Encoded Archival Description (EAD) XML describes the hierarchy of a collection down to the item level and supports discovery of both the collection and the individual items within it. For viewing multiple images associated with a single object record, MOAC utilizes Making of America 2 (MOA2) XML. MOA2 makes the images representing an item available to the viewer through a navigable table of contents; the display mimics the behavior of the analog item by, for example, allowing end-users to browse through the pages of an artist's book. Through the further extension of MOA2 with Text Encoding Initiative (TEI) Lite XML, not only does every single page of the book display in its correct order, but a transcription of its textual content also accompanies the digital images.
Conclusions
RQ "These two instances of fairly significant changes in the project's specifications may serve as a gentle reminder that despite its solid foundation in standards, the MOAC information architecture will continue to face the challenge of an ever-changing technical environment."
SOW
DC The author is Digital Media Developer at the UC Berkeley Art Museum & Pacific Film Archives, a member of the MOAC consortium.
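The interlocking formats this record describes can be pictured as a structural map: item-level metadata from the EAD finding aid, a MOA2-style table of contents over the page images, and a TEI transcription attached to each page. A hypothetical sketch using Python's ElementTree; the element and attribute names are illustrative only, not the actual MOA2 DTD (whose role was later taken over by METS):

```python
import xml.etree.ElementTree as ET

# Hypothetical structural map for a two-page artist's book.
# Element names are illustrative; the real project used the MOA2 DTD.

obj = ET.Element("digitalObject", id="bampfa-0001")  # invented identifier
for n, (image, label) in enumerate(
    [("page01.tif", "Cover"), ("page02.tif", "First opening")], start=1
):
    page = ET.SubElement(obj, "page", order=str(n), label=label)
    ET.SubElement(page, "image", href=image)
    # Each page also points at its TEI Lite transcription file.
    ET.SubElement(page, "transcription", href=f"page{n:02}.tei.xml")

# A viewer walks the pages in order, mimicking analog page-turning.
toc = [(p.get("order"), p.get("label")) for p in obj.findall("page")]
print(toc)
```

The point of the interlocking design is visible even in this toy version: the same structural map that orders the images also carries the hooks to the transcriptions, so one document drives both display and text delivery.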